12. Aspirin/MIGRAINES
Aspirin/MIGRAINES 6.0 consists of a code generator that builds neural network
simulations by reading a network description (written in a language
called "Aspirin") and generating a C simulation. An interface
(called "MIGRAINES") is provided to export data from the neural
network to visualization tools.
The system has been ported to a large number of platforms.
The goal of Aspirin is to provide a common extendible front-end language
and parser for different network paradigms.
The MIGRAINES interface is a terminal-based interface
that allows you to open Unix pipes to data in the neural
network. It replaces the NeWS1.1 graphical interface
of version 4.0 of the Aspirin/MIGRAINES software. The
new interface is not as simple to use as the version 4.0
interface, but it is much more portable and flexible.
The MIGRAINES interface allows users to output
neural network weight and node vectors to disk or to
other Unix processes. Users can display the data using
either public or commercial graphics/analysis tools.
Example filters are included that convert data exported through
MIGRAINES to formats readable by Gnuplot 3.0, Matlab, Mathematica,
and xgobi.
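For illustration, a filter of this kind might look like the following minimal
Python sketch. It assumes (an assumption made for this example, not MIGRAINES'
documented format) that each exported vector arrives on the pipe as one line of
whitespace-separated numbers, and it writes "index value" columns that gnuplot
can plot directly:

  # hypothetical_filter.py - toy stand-in for a MIGRAINES export filter.
  # Reads one vector per line from stdin (assumed format) and writes
  # "index value" pairs, with a blank line between vectors, which
  # gnuplot treats as separate data blocks.
  import sys

  for line in sys.stdin:
      values = line.split()
      if not values:
          continue
      for i, v in enumerate(values):
          print(i, v)
      print()   # blank line separates vectors for gnuplot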
The software is available from two FTP sites:
CMU's simulator collection on "pt.cs.cmu.edu" (128.2.254.155)
in "/afs/cs/project/connect/code/am6.tar.Z",
and UCLA's cognitive science machine "ftp.cognet.ucla.edu" (128.97.50.19)
in "alexis/am6.tar.Z".
The compressed tar file is a little less than 2 megabytes.
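None of the retrieval steps are specific to Aspirin/MIGRAINES. As a hedged
sketch of the usual anonymous-FTP procedure (host and path are taken from the
listing above; whether the server is still reachable is not guaranteed), the
transfer can be scripted with Python's standard ftplib:

  # Fetch am6.tar.Z from the UCLA site by anonymous FTP (sketch only).
  from ftplib import FTP

  ftp = FTP("ftp.cognet.ucla.edu")     # host from the listing above
  ftp.login()                          # anonymous login
  ftp.cwd("alexis")
  with open("am6.tar.Z", "wb") as f:
      ftp.retrbinary("RETR am6.tar.Z", f.write)
  ftp.quit()
  # afterwards:  uncompress am6.tar.Z ; tar xf am6.tar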
13. Adaptive Logic Network kit
Available from menaik.cs.ualberta.ca. This package differs from
traditional nets in that it uses logic functions rather than
floating point; for many tasks, ALNs can show many orders of
magnitude gain in training and performance speed. (A small sketch of
the idea follows the file list below.)
Anonymous ftp from menaik.cs.ualberta.ca [129.128.4.241]
unix source code and examples: /pub/atree2.tar.Z (145 KB)
Postscript documentation: /pub/atree2.ps.Z ( 76 KB)
MS-DOS Windows 3.0 version: /pub/atree2.zip (353 KB)
/pub/atree2zip.readme (1 KB)
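The sketch mentioned above: a minimal, illustrative Python evaluation of a
fixed tree of two-input logic nodes. The real atree package also adapts which
boolean function each node computes during training; none of that is shown
here, only the idea of computing with logic functions instead of floating
point:

  # Evaluate a fixed tree of logic nodes over boolean inputs (toy example).
  # A node is either ('in', i) for input bit i, or (op, left, right)
  # with op in {'AND', 'OR'}.
  def eval_node(node, inputs):
      if node[0] == 'in':
          return inputs[node[1]]
      left = eval_node(node[1], inputs)
      right = eval_node(node[2], inputs)
      return (left and right) if node[0] == 'AND' else (left or right)

  # toy tree computing (x0 AND x1) OR x2
  tree = ('OR', ('AND', ('in', 0), ('in', 1)), ('in', 2))
  print(eval_node(tree, [True, False, True]))   # prints True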
14. NeuralShell
Available from FTP site quanta.eng.ohio-state.edu
(128.146.35.1) in directory "pub/NeuralShell", filename
"NeuralShell.tar".
15. PDP
The PDP simulator package is available via anonymous FTP at
nic.funet.fi (128.214.6.100) in /pub/sci/neural/sims/pdp.tar.Z (0.2 MB)
The simulator is also available with the book
"Explorations in Parallel Distributed Processing: A Handbook of
Models, Programs, and Exercises" by McClelland and Rumelhart.
MIT Press, 1988.
Comment: "This book is often referred to as PDP vol III which is a very
misleading practice! The book comes with software on an IBM disk but
includes a makefile for compiling on UNIX systems. The version of
PDP available at nic.funet.fi seems identical to the one with the book
except for a bug in bp.c which occurs when you try to run a script of
PDP commands using the DO command. This can be found and fixed easily."
16. Xerion
Xerion is available via anonymous ftp from
ftp.cs.toronto.edu in the directory /pub/xerion.
xerion-3.0.PS.Z (0.9 MB) and xerion-3.0.tar.Z (1.1 MB) plus
several concrete simulators built with xerion (about 0.3 MB each,
see below).
Xerion runs on SGI and Sun machines and uses X Windows for graphics.
The software contains modules that implement Back Propagation,
Recurrent Back Propagation, Boltzmann Machine, Mean Field Theory,
Free Energy Manipulation, Hard and Soft Competitive Learning, and
Kohonen Networks. Sample networks built for each of the modules are
also included.
Contact: xerion@ai.toronto.edu
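For readers who have not met the first of those modules before, here is a
minimal, generic sketch of back propagation on the toy XOR problem (plain
numpy code written for this FAQ; it has nothing to do with Xerion's actual
code or interfaces):

  # Generic back propagation sketch: 2-4-1 sigmoid network trained on XOR.
  import numpy as np

  X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
  y = np.array([[0], [1], [1], [0]], dtype=float)

  rng = np.random.default_rng(0)
  W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden
  W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output

  def sigmoid(z):
      return 1.0 / (1.0 + np.exp(-z))

  lr = 1.0
  for step in range(5000):
      h = sigmoid(X @ W1 + b1)                 # forward pass
      out = sigmoid(h @ W2 + b2)
      d_out = (out - y) * out * (1 - out)      # backward pass (squared error)
      d_h = (d_out @ W2.T) * h * (1 - h)
      W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
      W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

  print(np.round(out.ravel(), 2))   # should approach [0, 1, 1, 0]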
17. Neocognitron simulator
An implementation is available for anonymous ftp at
[128.194.15.32] tamsun.tamu.edu as /pub/neocognitron.Z.tar
The simulator is written in C and comes with a list of references
that you need to read in order to understand the specifics of the
implementation. The unsupervised version is coded without (!)
C-cell inhibition.
18. Multi-Module Neural Computing Environment (MUME)
MUME is a simulation environment for multi-module neural computing. It
provides an object-oriented facility for the simulation and training
of multiple nets with various architectures and learning algorithms.
MUME includes a library of network architectures including feedforward,
simple recurrent, and continuously running recurrent neural networks.
Each architecture is supported by a variety of learning algorithms.
MUME can be used for large-scale neural network simulations as it provides
support for learning in multi-net environments. It also provides pre- and
post-processing facilities.
The modules are provided in a library. Several "front-ends" or clients are
also available.
MUME can be used to include non-neural computing modules (decision
trees, ...) in applications.
The software is the product of a number of staff and postgraduate students
at the Machine Intelligence Group at Sydney University Electrical
Engineering.
The software is written in 'C' and is being used on Sun and DEC
workstations. Efforts are underway to port it to the Fujitsu VP2200
vector processor using the VCC vectorising C compiler.
MUME is made available to research institutions on media/doc/postage cost
arrangements. Information on how to acquire it may be obtained by writing
(or sending email) to:
Marwan Jabri
SEDAL
Sydney University Electrical Engineering
NSW 2006 Australia
marwan@sedal.su.oz.au
19. LVQ_PAK, SOM_PAK
These are packages for Learning Vector Quantization and
Self-Organizing Maps, respectively.
They have been built by the LVQ/SOM Programming Team of the
Helsinki University of Technology, Laboratory of Computer and
Information Science, Rakentajanaukio 2 C, SF-02150 Espoo, FINLAND
There are versions for Unix and MS-DOS available from
cochlea.hut.fi (130.233.168.48) in
/pub/lvq_pak/lvq_pak-2.1.tar.Z (340 kB, Unix)
/pub/lvq_pak/lvq_p2r1.exe (310 kB, MS-DOS self-extract archive)
/pub/som_pak/som_pak-1.1.tar.Z (246 kB, Unix)
/pub/som_pak/som_p1r1.exe (215 kB, MS-DOS self-extract archive)
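As a hedged illustration of what the LVQ package trains (generic numpy code
written for this FAQ, not taken from lvq_pak), the core of the LVQ1 rule is a
single nearest-codebook update per training sample:

  # One LVQ1 update step: pull the nearest codebook vector toward the
  # sample if the class labels match, push it away otherwise.
  import numpy as np

  def lvq1_step(codebook, labels, x, x_label, alpha=0.05):
      winner = np.argmin(np.sum((codebook - x) ** 2, axis=1))
      sign = 1.0 if labels[winner] == x_label else -1.0
      codebook[winner] += sign * alpha * (x - codebook[winner])
      return winner

  # toy usage with made-up data
  rng = np.random.default_rng(1)
  codebook = rng.normal(size=(4, 2))    # four 2-D codebook vectors
  labels = np.array([0, 0, 1, 1])       # their class labels
  lvq1_step(codebook, labels, x=np.array([0.5, -0.2]), x_label=1)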
For some of these simulators there are user mailing lists. Get the
packages and look into their documentation for further info.
If you are using a small computer (PC, Mac, etc.) you may want to have
a look at the Central Neural System Electronic Bulletin Board
(see Answer 14)
Modem: 509-627-6CNS; Sysop: Wesley R. Elsberry;
P.O. Box 1187, Richland, WA 99352; welsberr@sandbox.kenn.wa.us
There are lots of small simulator packages in the CNS ANNSIM file set.
There is an ftp mirror site for the CNS ANNSIM file set at
me.uta.edu (129.107.2.20) in the /pub/neural directory. Most ANN
offerings are in /pub/neural/annsim.
------------------------------------------------------------------------
-A16.) Commercial software packages for NN simulation ?
[preliminary]
[who will write some short comment on each of the most
important packages ?]
Issue number 1 of each volume of the journal "Neural Networks" has a list
of some dozens of commercial suppliers of Neural Network products and
services: Software, Hardware, Support, Programming, Design and Service.
Here is a bare list of names of simulators running on PCs (and, in part,
on some other platforms too):
1. NeuralWorks Professional 2+
2. AIM
3. BrainMaker Professional
4. Brain Cel
5. Neural Desk
6. Neural Case
7. Neuro Windows
8. Explorenet 3000
------------------------------------------------------------------------
-A17.) Neural Network hardware ?
[preliminary]
[who will write some short comment on the most important
HW-packages and chips ?]
Issue number 1 of each volume of the journal "Neural Networks" has a list
of some dozens of suppliers of Neural Network support:
Software, Hardware, Support, Programming, Design and Service.
Here is a list of companies contributed by xli@computing-maths.cardiff.ac.uk:
1. HNC, INC.
5501 Oberlin Drive
San Diego
California 92121
(619) 546-8877
and a second address at
7799 Leesburg Pike, Suite 900
Falls Church, Virginia
22043
(703) 847-6808
Note: Australian Dist.: Unitronics
Tel : (09) 4701443
Contact: Martin Keye
HNC markets:
'Image Document Entry Processing Terminal' - it recognises
handwritten documents and converts the info to ASCII.
'ExploreNet 3000' - a NN demonstrator
'Anza/DP Plus' - a Neural Net board with 25 MFlops or 12.5M peak
interconnects per second.
2. SAIC (Science Applications International Corporation)
10260 Campus Point Drive
MS 71, San Diego
CA 92121
(619) 546 6148
Fax: (619) 546 6736
3. Micro Devices
30 Skyline Drive
Lake Mary
FL 32746-6201
(407) 333-4379
Micro Devices makes the MD1220 - 'Neural Bit Slice'.
Each of the products mentioned so far has a very different usage.
Although this sounds similar to Intel's product, the
architectures are not similar.
4. Intel Corp
2250 Mission College Blvd
Santa Clara, CA 95052-8125
Attn ETANN, Mail Stop SC9-40
(408) 765-9235
Intel is making an experimental chip:
80170NW - Electrically Trainable Analog Neural Network (ETANN)
It has 64 'neurons' on it - almost fully internally connected -
and the chip can be put in a hierarchical architecture to do 2 billion
interconnects per second.
Support software has already been made by
California Scientific Software
10141 Evening Star Dr #6
Grass Valley, CA 95945-9051
(916) 477-7481
Their product is called 'BrainMaker'.
5. NeuralWare, Inc
Penn Center West
Bldg IV Suite 227
Pittsburgh
PA 15276
They sell only software/simulators, but for many platforms.
6. Tubb Research Limited
7a Lavant Street
Petersfield
Hampshire
GU32 2EL
United Kingdom
Tel: +44 730 60256
7. Adaptive Solutions Inc
1400 NW Compton Drive
Suite 340
Beaverton, OR 97006
U. S. A.
Tel: 503 - 690 - 1236
FAX: 503 - 690 - 1249
------------------------------------------------------------------------
-A19.) Databases for experimentation with NNs ?
[are there any more ?]
1. The nn-bench Benchmark collection
accessible via anonymous FTP on
"pt.cs.cmu.edu"
in directory
"/afs/cs/project/connect/bench"
or via the Andrew file system in the directory
"/afs/cs.cmu.edu/project/connect/bench"
In case of problems, the email contact is "nn-bench-request@cs.cmu.edu".
The data sets in this repository include the 'nettalk' data, the
'two spirals' problem, a vowel recognition task, and a few others.
2. UCI machine learning database
accessible via anonymous FTP on
"ics.uci.edu" [128.195.1.1]
in directory
"/pub/machine-learning-databases"
3. NIST special databases of the National Institute of Standards
and Technology:
NIST special database 2:
Structured Forms Reference Set (SFRS)
The NIST database of structured forms contains 5,590 full page images
of simulated tax forms completed using machine print. THERE IS NO REAL
TAX DATA IN THIS DATABASE. The structured forms used in this database
are 12 different forms from the 1988 IRS 1040 Package X. These
include Forms 1040, 2106, 2441, 4562, and 6251 together with Schedules
A, B, C, D, E, F and SE. Eight of these forms contain two pages or
form faces making a total of 20 form faces represented in the
database. Each image is stored in bi-level black and white raster
format. The images in this database appear to be real forms prepared
by individuals but the images have been automatically derived and
synthesized using a computer and contain no "real" tax data. The entry
field values on the forms have been automatically generated by a
computer in order to make the data available without the danger of
distributing privileged tax information. In addition to the images
the database includes 5,590 answer files, one for each image. Each
answer file contains an ASCII representation of the data found in the
entry fields on the corresponding image. Image format documentation
and example software are also provided. The uncompressed database
totals approximately 5.9 gigabytes of data.
NIST special database 3:
Binary Images of Handwritten Segmented Characters (HWSC)
Contains 313,389 isolated character images segmented from the
2,100 full-page images distributed with "NIST Special Database 1":
223,125 digits, 44,951 upper-case, and 45,313 lower-case character
images. Each character image has been centered in a separate
128 by 128 pixel region; the error rate of the segmentation and
assigned classification is less than 0.1%.
The uncompressed database totals approximately 2.75 gigabytes of
image data and includes image format documentation and example software.
NIST special database 4:
8-Bit Gray Scale Images of Fingerprint Image Groups (FIGS)
The NIST database of fingerprint images contains 2000 8-bit gray scale
fingerprint image pairs. Each image is 512 by 512 pixels with 32 rows
of white space at the bottom and classified using one of the five
following classes: A=Arch, L=Left Loop, R=Right Loop, T=Tented Arch,
W=Whorl. The database is evenly distributed over each of the five
classifications with 400 fingerprint pairs from each class. The images
are compressed using a modified JPEG lossless compression algorithm
and require approximately 636 Megabytes of storage compressed and 1.1
Gigabytes uncompressed (1.6 : 1 compression ratio). The database also
includes format documentation and example software.
A short overview:
Special Database 1 - NIST Binary Images of Printed Digits, Alphas, and Text
Special Database 2 - NIST Structured Forms Reference Set of Binary Images
Special Database 3 - NIST Binary Images of Handwritten Segmented Characters
Special Database 4 - NIST 8-bit Gray Scale Images of Fingerprint Image Groups
Special Database 6 - NIST Structured Forms Reference Set 2 of Binary Images
Special Database 7 - NIST Test Data 1: Binary Images of Handprinted Segmented
Characters
Special Software 1 - NIST Scoring Package Release 1.0
Special Database 1 - $895.00
Special Database 2 - $250.00
Special Database 3 - $895.00
Special Database 4 - $250.00
Special Database 6 - $250.00
Special Database 7 - $1,000.00
Special Software 1 - $1,150.00
The system requirements for all databases are a 5.25" CD-ROM drive
with software to read ISO-9660 format.
Contact: Darrin L. Dimmick
dld@magi.ncsl.nist.gov (301)975-4147
If you wish to order the database, please contact:
Standard Reference Data
National Institute of Standards and Technology
221/A323
Gaithersburg, MD 20899
(301)975-2208 or (301)926-0416 (FAX)
4. CEDAR CD-ROM 1: Database of Handwritten
Cities, States, ZIP Codes, Digits, and Alphabetic Characters
The Center of Excellence for Document Analysis and Recognition (CEDAR)
at the State University of New York at Buffalo announces the availability of
CEDAR CDROM 1: USPS Office of Advanced Technology
The database contains handwritten words and ZIP Codes
in high resolution grayscale (300 ppi 8-bit) as well as
binary handwritten digits and alphabetic characters (300 ppi
1-bit). This database is intended to encourage research in
off-line handwriting recognition by providing access to
handwriting samples digitized from envelopes in a working
post office.
Specifications of the database include:
+ 300 ppi 8-bit grayscale handwritten words (cities,
states, ZIP Codes)
o 5632 city words
o 4938 state words
o 9454 ZIP Codes
+ 300 ppi binary handwritten characters and digits:
o 27,837 mixed alphas and numerics segmented
from address blocks
o 21,179 digits segmented from ZIP Codes
+ every image supplied with a manually determined
truth value
+ extracted from live mail in a working U.S. Post
Office
+ word images in the test set supplied with dictionaries
of postal words that simulate partial recognition of the
corresponding ZIP Code.
+ digit images included in the test set that simulate
automatic ZIP Code segmentation. Results on these data
can be projected to overall ZIP Code recognition performance.
+ image format documentation and software included
System requirements are a 5.25" CD-ROM drive with software to read ISO-
9660 format.
For any further information, including how to order the
database, please contact:
Jonathan J. Hull, Associate Director, CEDAR, 226 Bell Hall
State University of New York at Buffalo, Buffalo, NY 14260
hull@cs.buffalo.edu (email)
------------------------------------------------------------------------
That's all folks.
========================================================================
Acknowledgements: Thanks to all the people who helped to get the stuff
above into the posting. I cannot name them all, because
I would make far too many errors then. :->
No ? Not good ? You want individual credit ?
OK, OK. I'll try to name them all. But: no guarantee....
THANKS FOR HELP TO:
(in alphabetical order of email addresses, I hope)
S.Taimi Ames <ames@reed.edu>
anderson@atc.boeing.com
Kim L. Blackwell <avrama@helix.nih.gov>
Paul Bakker <bakker@cs.uq.oz.au>
Yijun Cai <caiy@mercury.cs.uregina.ca>
L. Leon Campbell <campbell@brahms.udel.edu>
David DeMers <demers@cs.ucsd.edu>
Denni Rognvaldsson <denni@thep.lu.se>
Wesley R. Elsberry <elsberry@cse.uta.edu>
Frank Schnorrenberg <fs0997@easttexas.tamu.edu>
Gary Lawrence Murphy <garym@maya.isis.org>
gaudiano@park.bu.edu
Glen Clark <opto!glen@gatech.edu>
guy@minster.york.ac.uk
Jean-Denis Muller <jdmuller@vnet.ibm.com>
Jonathan Kamens <jik@MIT.Edu>
Luke Koops <koops@gaul.csd.uwo.ca>
William Mackeown <mackeown@compsci.bristol.ac.uk>
Peter Marvit <marvit@cattell.psych.upenn.edu>
Yoshiro Miyata <miyata@sccs.chukyo-u.ac.jp>
Jyrki Alakuijala <more@ee.oulu.fi>
mrs@kithrup.com
Michael Plonski <plonski@aero.org>
[myself]
Richard Cornelius <richc@rsf.atd.ucar.edu>
Rob Cunningham <rkc@xn.ll.mit.edu>
Osamu Saito <saito@nttica.ntt.jp>
Ted Stockwell <ted@aps1.spa.umn.edu>
Thomas.Vogel@cl.cam.ac.uk
Ulrich Wendl <uli@unido.informatik.uni-dortmund.de>
Matthew P Wiener <weemba@sagi.wistar.upenn.edu>
Bye
Lutz
--
Lutz Prechelt (email: prechelt@ira.uka.de) | Whenever you
Institut fuer Programmstrukturen und Datenorganisation | complicate things,
Universitaet Karlsruhe; D-7500 Karlsruhe 1; Germany | they get
(Voice: ++49/721/608-4317, FAX: ++49/721/694092) | less simple.
Xref: bloom-picayune.mit.edu news.newusers.questions:11874 news.answers:4766
Path: bloom-picayune.mit.edu!senator-bedfellow.mit.edu!athena.mit.edu!jik
From: jik@athena.mit.edu (Jonathan I. Kamens)
Newsgroups: news.newusers.questions,news.answers
Subject: Welcome to news.newusers.questions! (weekly posting)
Supersedes: <news-newusers-intro_724572073@athena.mit.edu>
Followup-To: news.newusers.questions
Date: 24 Dec 1992 06:01:19 GMT
Organization: Massachusetts Institute of Technology
Lines: 305
Approved: news-answers-request@MIT.Edu
Distribution: world
Expires: 14 Jan 1993 06:01:12 GMT
Message-ID: <news-newusers-intro_725176872@athena.mit.edu>
NNTP-Posting-Host: pit-manager.mit.edu
Summary: READ THIS BEFORE POSTING TO THIS NEWSGROUP
X-Version: $Id: news-newusers-intro,v 1.14 1992/12/21 21:59:09 jik Exp $
Archive-name: news-newusers-intro
Version: $Id: news-newusers-intro,v 1.14 1992/12/21 21:59:09 jik Exp $
Welcome to the news.newusers.questions newsgroup! According to the
"List of Active Newsgroups" posting in news.announce.newusers, the
purpose of this newsgroup is "Q & A for users new to the Usenet." So
if you've got questions about the USENET, this is the place to post
them!
Get to know news.announce.newusers.
However, before you do that, there is another newsgroup with which
you should become acquainted. The news.announce.newusers newsgroup
contains (once again according to the "List of Active Newsgroups"
posting) "Explanatory postings for new users." Its purpose is to
provide a base set of information with which all participants in the
USENET should be familiar in order to make the USENET a better place
for all of us.
If you have not already done so, you are strongly encouraged to read
the introductory postings in news.announce.newusers before posting any
messages to any newsgroup. In particular, the following postings in
that newsgroup might be considered the "mandatory course" for new
users:
A Primer on How to Work With the Usenet Community
Answers to Frequently Asked Questions about Usenet
Emily Postnews Answers Your Questions on Netiquette
Hints on writing style for Usenet
Rules for posting to Usenet
What is Usenet?
Furthermore, you should be aware that the following articles exist in
the newsgroup so that you can use them for reference in the future
(you might want to glance at them now so you have a slight familiarity
with their contents):
A Guide to Social Newsgroups and Mailing Lists
Alternative Newsgroup Hierarchies, Part I
Alternative Newsgroup Hierarchies, Part II
How to Create a New Usenet Newsgroup
How to Get Information about Networks
Introduction to news.announce
Introduction to the news.answers newsgroup
List of Active Newsgroups, Part I
List of Active Newsgroups, Part II
List of Moderators for Usenet
List of Periodic Informational Postings, Part 1/4
List of Periodic Informational Postings, Part 2/4
List of Periodic Informational Postings, Part 3/4
List of Periodic Informational Postings, Part 4/4
Publicly Accessible Mailing Lists, Part I
Publicly Accessible Mailing Lists, Part II
Publicly Accessible Mailing Lists, Part III
Publicly Accessible Mailing Lists, Part IV
Regional Newsgroup Hierarchies, Part I (*)
Regional Newsgroup Hierarchies, Part II (*)
Regional Newsgroup Hierarchies, Part III (*)
USENET Software: History and Sources
(*) Note that as of December 21, 1992, the "Regional Newsgroup
Hierarchies" postings are unavailable. However, their maintainer
does plan to resume posting them at some point in the future,
hence their inclusion in this list.
Finally, note that before posting in any newsgroup, you should read
the group for a while in order to become familiar with what is
acceptable in it and to make sure that you have seen and read the FAQ
posting(s) for the newsgroup, if there is/are any.
The articles in news.announce.newusers are posted in such a way that
each version should stay around at each site until the new version is
posted. However, some sites are configured incorrectly so that this
does not occur. If the articles listed above do not appear in the
news.announce.newusers newsgroup at your site, you can get copies of
them using the instructions appended to the end of this message.
Get to know news.answers.
The news.answers newsgroup contains a collection of all (well, it's
*supposed* to contain all of them, but it's still missing quite a few)
of the periodic informational postings that appear on the USENET
(including, for example, this posting, as well as most or all of the
news.announce.newusers postings listed above). You probably won't
want to sit down and read through every posting in news.answers from
beginning to end, especially since postings will be repeated
periodically and many of them will concern topics in which you have no
interest. However, news.answers is a good place to "browse the wealth
of the USENET." Since the periodic informational postings in the
various newsgroups tend to be the "distilled wisdom" of those
newsgroups, news.answers might be considered the distilled wisdom of
the USENET. Glancing at the articles in it will give you a good idea
of the breadth of information embodied in the USENET.
Get to know what other newsgroups are out there.
There is no well-defined limit on what questions belong in this
newsgroup and what questions do not. However, it is to your advantage
to know when there is a more appropriate newsgroup for you to post
particular questions in, because when you choose the appropriate
newsgroup, more people who can answer your question will see it.
For example, if you have a question about a UNIX command and that
question is not related to the USENET or to accessing NetNews, it
would probably be more appropriate in comp.unix.questions than in
news.newusers.questions. Furthermore, since many experienced UNIX
users read comp.unix.questions, you are more likely to get a useful
response if you post there.
It's often a good idea to try to get help locally.
Many questions that are asked by new USENET users concern details
about their particular site that no one else is going to know about.
Furthermore, new users often don't know what information to provide
when asking their questions, so several exchanges are necessary before
the people helping out have enough information to be able to give a
conclusive answer. Finally, it's often easier to learn something when
you're a new user by having it shown to you, in person, while sitting
in front of a terminal.
For this reason, it is often a good idea to try to get help locally
with your questions before you post them to news.newusers.questions
(or to any other newsgroup, for that matter). After you've been
participating in the USENET for a while, you'll get an intuitive feel
for what questions really belong in postings, and what questions are
probably answerable by someone at your site. If you don't feel you've
reached that point, it's probably a good idea to try to get answers to
pretty much ALL of your questions locally before posting.
Note that "getting help locally" includes checking available local
documentation for whatever you are trying to do. If you are having a
problem with the "rn" newsreader, for example, try looking for
information in the rn man page (type "man rn", and if it doesn't work
find someone who knows what's going on and ask them why "man rn"
doesn't display the rn man page).
Remember that posting to the USENET uses resources. You may not pay
for your posting, but other people are. If you post a question that
people outside your site CAN'T answer, or that people inside your site
CAN answer with minimal effort, the resources consumed by your posting
were consumed needlessly.
If you DO decide to ask a question in news.newusers.questions...
If, after checking the postings in news.announce.newusers to see if
your question is answered there, and after looking to see if there is
a more appropriate group in which to post it, and after trying to get
help locally, you still think your question belongs in
news.newusers.questions, then go right ahead and post it.
However, you should keep in mind when preparing your question that
the people who will be reading it and trying to help you are not
mind-readers. We don't know what your site is like. There are
thousands of sites on the USENET, and they're all just a little bit
different, so the more details you can provide when asking your
question, the more likely it is that people will be able to help you.
Try to provide the following information when posting a question.
If you don't know the answers to some of these questions, then try to
find them out from someone at your site and save them so that you can
use them when posting questions in the future:
1. What kind of machine are you working on? For example: Macintosh,
VAX, DECstation, IBM PC, PC compatible (which one), Cray, RS/6000.
2. What operating system is it running? For example: MacOS, MS-DOS,
UNIX, VM/CMS, VMS.
3. What version of the operating system? For example: MacOS 7.0,
Ultrix 4.2 UNIX, BSD 4.3, etc.
4. What news reader (or whatever program you are having trouble with)
are you using? What command do you type to start up whatever
program is giving you trouble?
5. What version of the program is it?
6. If you are trying to interpret some sort of error, what exactly did
you type to provoke the error, what was the exact error, and how is
the actual error different from what you expected to happen? For
example, if you're trying to figure out why a mail message bounced,
what address did you send the mail to, and what error message came
back in the bounced message?
7. What have you done to try to find the answer to your question
before posting? If you've tried different possible answers
already, exactly what have you tried, and what was the result?
8. If you have checked the documentation and cannot understand the